Nonconvex Sparse Representation With Slowly Vanishing Gradient Regularizers


Similar Articles

Large-Scale Low-Rank Matrix Learning with Nonconvex Regularizers

Low-rank modeling has many important applications in computer vision and machine learning. While the matrix rank is often approximated by the convex nuclear norm, the use of nonconvex low-rank regularizers has demonstrated better empirical performance. However, the resulting optimization problem is much more challenging. Recent state-of-the-art methods require an expensive full SVD in each iteration. ...

Full text
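
To make the idea concrete, the sketch below implements a proximal step for one common nonconvex spectral penalty, the capped-ℓ1 norm λ·Σ min(σ_i, θ) applied to the singular values. The choice of penalty and the use of a full SVD are illustrative assumptions; the abstract's point is precisely that scalable methods must avoid the full SVD.

import numpy as np

def prox_capped_l1(v, lam, theta):
    # Exact prox of lam * min(x, theta) over x >= 0 (v holds nonnegative
    # singular values): minimize each piece, then keep the better one.
    c1 = np.clip(np.maximum(v - lam, 0.0), 0.0, theta)  # piece x <= theta
    f1 = 0.5 * (c1 - v) ** 2 + lam * c1
    c2 = np.maximum(v, theta)                            # piece x >= theta
    f2 = 0.5 * (c2 - v) ** 2 + lam * theta
    return np.where(f1 <= f2, c1, c2)

def prox_spectral_capped_l1(X, lam, theta):
    # Apply the scalar prox to the spectrum of X. The full SVD is used
    # here only for clarity; it is exactly the per-iteration cost that
    # scalable solvers try to remove.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * prox_capped_l1(s, lam, theta)) @ Vt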

Robust Discriminative Clustering with Sparse Regularizers

Clustering high-dimensional data often requires some form of dimensionality reduction, where clustered variables are separated from “noise-looking” variables. We cast this problem as finding a low-dimensional projection of the data which is well-clustered. This yields a one-dimensional projection in the simplest situation with two clusters, and extends naturally to a multi-label scenario for mo...

Full text
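
As a rough illustration of the projection idea in the two-cluster case, the sketch below alternates between fitting a sparse one-dimensional projection and reassigning labels from that projection. The alternating scheme and the ℓ1 (Lasso) regularizer are illustrative assumptions, not the paper's algorithm.

import numpy as np
from sklearn.linear_model import Lasso

def sparse_discriminative_clustering(X, lam=0.05, iters=10, seed=0):
    # Illustrative scheme, not the paper's method. Alternate:
    # (1) fit a sparse direction w regressing X onto the current +/-1
    # labels, (2) recompute labels by splitting the 1-D projection at
    # its median. Sparsity in w separates informative variables from
    # "noise-looking" ones.
    rng = np.random.default_rng(seed)
    y = np.sign(rng.standard_normal(X.shape[0]))
    for _ in range(iters):
        model = Lasso(alpha=lam).fit(X, y)
        z = model.predict(X)                       # 1-D projection
        y = np.where(z > np.median(z), 1.0, -1.0)  # two clusters
    return model.coef_, y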

ADMM for Training Sparse Structural SVMs with Augmented ℓ1 Regularizers

The size |Y| of the output space Y is exponential, and optimization over the entire space Y is computationally expensive. Hence, in the sequential dual optimization method, the optimization of (A.6) is restricted to the set Y_i = {y : α_iy > 0} maintained for each example. For clarity, we present the sequential dual optimization method to solve (A.2) in Algorithm 3. The algorithm starts with Y_i = ...

Full text
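
The working-set idea in this excerpt can be shown on a toy problem where the structured output space is just a set of class labels. The multiclass setting and the subgradient update are illustrative stand-ins; the paper uses sequential dual optimization (and ADMM) over a genuinely exponential output space Y.

import numpy as np

def working_set_svm(X, labels, n_classes, C=1.0, epochs=5, lr=0.1):
    # Toy stand-in for structural SVM training. Each example i keeps a
    # small active set Y_i of outputs instead of optimizing over all of Y.
    # Loss-augmented inference finds the most violated output y_hat,
    # which is added to Y_i before the update.
    n, d = X.shape
    W = np.zeros((n_classes, d))
    Y_i = [{int(labels[i])} for i in range(n)]   # start with the true label
    for _ in range(epochs):
        for i in range(n):
            scores = W @ X[i]
            margins = scores + (np.arange(n_classes) != labels[i])
            y_hat = int(np.argmax(margins))      # most violated output
            Y_i[i].add(y_hat)                    # grow the working set
            if margins[y_hat] > scores[labels[i]]:
                # subgradient step on the hinge violation
                W[y_hat] -= lr * C * X[i]
                W[labels[i]] += lr * C * X[i]
    return W, Y_i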

Learning Sparse Visual Representations with Leaky Capped Norm Regularizers

Sparsity-inducing regularization is an important component of learning over-complete visual representations. Despite the popularity of ℓ1 regularization, in this paper we investigate the use of nonconvex regularizers for this problem. Our contribution consists of three parts. First, we propose the leaky capped norm regularization (LCNR), which allows model weights below a certain threshold to...

Full text
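
One plausible reading of a "leaky" capped norm is sketched below: weights below a threshold θ are penalized at the full rate, weights above it at a small leaky rate, so the penalty's slope never vanishes entirely. This functional form is an assumption for illustration; the paper defines its own LCNR.

import numpy as np

def leaky_capped_penalty(w, theta=1.0, alpha=1.0, beta=0.05):
    # Assumed form (not necessarily the paper's): alpha*|w| for
    # |w| <= theta, and alpha*theta + beta*(|w| - theta) beyond the cap,
    # so the slope "leaks" at rate beta instead of dropping to zero.
    a = np.abs(w)
    return np.where(a <= theta, alpha * a,
                    alpha * theta + beta * (a - theta)).sum()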

Sparse Reduced Rank Regression With Nonconvex Regularization

In this paper, the estimation problem for the sparse reduced-rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection, with applications in signal processing, econometrics, etc. The problem is formulated to minimize the least-squares loss with a sparsity-inducing penalty under an orthogonality constraint. Convex sparsity-inducing ...

Full text
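
A hedged sketch of the alternating structure such SRRR formulations suggest: write the coefficient matrix as A Bᵀ with BᵀB = I, update the sparse factor A by a proximal gradient step, and solve the orthogonality-constrained update for B in closed form (Procrustes). The soft-thresholding used here is a convex stand-in; the paper studies nonconvex sparsity penalties.

import numpy as np

def srrr_sketch(X, Y, r, lam=0.1, iters=100, seed=0):
    # Fit Y ~ X @ A @ B.T with orthonormal B (B.T @ B = I) and sparse A.
    # Soft-thresholding is a convex stand-in for the paper's nonconvex
    # penalties.
    rng = np.random.default_rng(seed)
    p, q = X.shape[1], Y.shape[1]
    A = 0.01 * rng.standard_normal((p, r))
    B = np.linalg.qr(rng.standard_normal((q, r)))[0]
    step = 1.0 / np.linalg.norm(X.T @ X, 2)   # Lipschitz step size
    for _ in range(iters):
        G = X.T @ (X @ A - Y @ B)             # gradient in A (uses B.T @ B = I)
        A = A - step * G
        A = np.sign(A) * np.maximum(np.abs(A) - step * lam, 0.0)  # soft-threshold
        U, _, Vt = np.linalg.svd(Y.T @ X @ A, full_matrices=False)
        B = U @ Vt                            # Procrustes update, keeps B.T @ B = I
    return A, B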


Journal

Journal: IEEE Access

Year: 2020

ISSN: 2169-3536

DOI: 10.1109/access.2020.3009971